When Human Coders (and Machines) Disagree on the Meaning of Facial Affect in Spontaneous Videos

Authors

  • Mohammed E. Hoque
  • Rana El Kaliouby
  • Rosalind W. Picard
Abstract

This paper describes the challenges of getting ground truth affective labels for spontaneous video, and presents implications for systems such as virtual agents that have automated facial analysis capabilities. We first present a dataset from an intelligent tutoring application and describe the most prevalent approach to labeling such data. We then present an alternative labeling approach, which closely models how the majority of automated facial analysis systems are designed. We show that while participants, peers, and trained judges report high inter-rater agreement on expressions of delight, confusion, flow, frustration, boredom, surprise, and neutral when shown the entire 30 minutes of video for each participant, inter-rater agreement drops below chance when human coders are asked to watch and label short 8-second clips for the same set of labels. We also perform discriminative analysis of facial action units for each affective state represented in the clips. The results emphasize that human coders rely heavily on factors such as familiarity with the person and the context of the interaction to correctly infer a person's affective state; without this information, the reliability of both humans and machines in attributing affective labels to spontaneous facial-head movements drops significantly.
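The abstract's central measurement is inter-rater agreement corrected for chance. A standard statistic for this with two coders is Cohen's kappa, sketched below with hypothetical clip labels over the paper's seven affective states (the specific label sequences are illustrative, not from the article's data):

```python
# Minimal sketch of chance-corrected agreement (Cohen's kappa) between
# two coders labeling short clips. Labels below are hypothetical.
from collections import Counter

LABELS = ["delight", "confusion", "flow", "frustration",
          "boredom", "surprise", "neutral"]

def cohens_kappa(coder_a, coder_b):
    """Observed agreement corrected by the agreement expected by chance."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: product of the two coders' marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in LABELS)
    return (observed - expected) / (1 - expected)

a = ["delight", "confusion", "flow", "neutral", "boredom", "neutral"]
b = ["delight", "frustration", "flow", "neutral", "surprise", "neutral"]
print(round(cohens_kappa(a, b), 3))  # → 0.6
```

A kappa of 0 means the coders agree no more often than chance would predict; "below chance," as reported for the 8-second clips, corresponds to a negative kappa.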


Similar articles

Automated Facial Expression Recognition for Product Evaluation

Facial expressions play an important role in signaling interest and opinion about a product. In this paper we present a negative affect classifier that detects nose wrinkles (AU9) or brow lowering (AU4). Our approach uses an automatic mechanism for extracting the frames of interest in videos. We employ an automatic facial feature tracker to detect the area of interest in each frame; we then apply...


Facial Action Coding System

The Facial Action Coding System (FACS) is a widely used protocol for recognizing and labelling facial expressions by describing the movement of the muscles of the face. FACS is used to objectively measure the frequency and intensity of facial expressions without assigning any emotional meaning to those muscle movements. Instead, FACS breaks down facial expressions into their smallest discriminable mo...
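To make the FACS idea concrete, the sketch below maps a small, standard subset of AU codes to their FACS names and renders a detected AU set as purely descriptive labels; the example combination shown is a common textbook illustration, not taken from this article:

```python
# Illustrative subset of FACS action units. AU codes and names follow the
# standard FACS inventory; the combination below is only an example.
ACTION_UNITS = {
    1: "Inner brow raiser",
    2: "Outer brow raiser",
    4: "Brow lowerer",
    6: "Cheek raiser",
    9: "Nose wrinkler",
    12: "Lip corner puller",
    15: "Lip corner depressor",
}

def describe(aus):
    """Render detected AUs as FACS descriptions, with no emotional labels."""
    return [f"AU{c}: {ACTION_UNITS.get(c, 'unknown')}" for c in sorted(aus)]

print(describe({6, 12}))  # cheek raiser plus lip corner puller
```

Note that the output stays at the muscle-movement level; inferring an emotion such as "delight" from AU6 + AU12 is a separate interpretive step, which is exactly what FACS deliberately leaves out.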


The meaning of human life and the creating role of religion

Why has man been created? Can human life be considered meaningful? What can be agreed upon as the role of religion in making life meaningful? To answer these questions, one should study the concept of a meaningful life, the life-giving agent, and its characteristics. One can consider creativity as the agent giving meaning to life. By creativity is meant such elements as knowledge, freedom and de...


A Nonlinear Grayscale Morphological and Unsupervised method for Human Facial Synthesis Based on an Example Image

Human facial synthesis from an example image is a requirement for biometric applications that identify individuals. In this paper, face generation consists of three main steps. In the first step, significant lines and edges of the example image are detected using nonlinear grayscale morphology. Then, hair areas are identified from the sample face. The fin...


Facial Expression Recognition Based on Anatomical Structure of Human Face

Automatic analysis of human facial expressions is one of the challenging problems in machine vision systems. It has many applications in human-computer interaction, such as social signal processing, social robots, deceit detection, interactive video, and behavior monitoring. In this paper, we develop a new method for automatic facial expression recognition based on facial muscle anatomy and hum...



Publication date: 2009